Fastest rates for stochastic mirror descent methods

Authors

Abstract

Relative smoothness—a notion introduced in Birnbaum et al. (Proceedings of the 12th ACM conference on electronic commerce, ACM, pp 127–136, 2011) and recently rediscovered in Bauschke et al. (Math Oper Res 330–348, 2016) and Lu et al. (Relatively-smooth convex optimization by first-order methods, applications, arXiv:1610.05708, 2016)—generalizes the standard notion of smoothness typically used in the analysis of gradient type methods. In this work we take ideas from the well studied field of stochastic convex optimization and use them in order to obtain faster algorithms for minimizing relatively smooth functions. We propose and analyze two new algorithms: relative Randomized Coordinate Descent (relRCD) and relative Stochastic Gradient Descent (relSGD), both generalizing famous algorithms of the standard smooth setting. The methods we propose can in fact be seen as particular instances of stochastic mirror descent algorithms, which have usually been analyzed under stronger assumptions: Lipschitzness of the objective and strong convexity of the reference function. As a consequence, one of the proposed methods, relRCD, corresponds to the first stochastic variant of a mirror descent algorithm with a linear convergence rate.
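
The abstract gives no pseudocode, so the following is only a rough sketch of the kind of update stochastic mirror descent methods such as relSGD build on, not the paper's algorithm. It uses the negative-entropy reference function on the probability simplex, for which the Bregman proximal step has the closed form of an exponentiated-gradient update; the toy least-squares objective, the function name smd_entropy_step, and the fixed step size are our own illustrative choices (in the relative-smoothness analysis the step size is tied to the relative smoothness constant).

```python
import numpy as np

def smd_entropy_step(x, g, step_size):
    """One stochastic mirror descent step on the probability simplex.

    With h(x) = sum_i x_i log x_i (negative entropy), the Bregman step
        x_new = argmin_y { <g, y> + (1/step_size) * D_h(y, x) }
    has the closed-form exponentiated-gradient update below.
    """
    w = x * np.exp(-step_size * g)
    return w / w.sum()

rng = np.random.default_rng(0)
n, d = 200, 10                       # number of component functions, dimension
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

x = np.full(d, 1.0 / d)              # start at the uniform distribution
step_size = 0.05                     # illustrative constant step size

for k in range(1000):
    i = rng.integers(n)              # sample one component uniformly
    g = A[i] * (A[i] @ x - b[i])     # stochastic gradient of 0.5*(A[i] @ x - b[i])**2
    x = smd_entropy_step(x, g, step_size)

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```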

Similar resources

Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization

In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate the block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce ...
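
The abstract above is cut off and describes SBMD only at a high level; the snippet below is a minimal illustration, under our own assumptions, of the block-coordinate part of such a scheme: each iteration samples one data point and one coordinate block, and updates only that block with a prox/mirror step driven by the stochastic gradient restricted to the block. We use a Euclidean distance-generating function per block, so the block step reduces to a plain gradient step, and we omit the incremental block averaging of the actual method.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_blocks = 12, 4
blocks = np.array_split(np.arange(d), n_blocks)   # fixed partition of the coordinates into blocks

A = rng.normal(size=(50, d))
b = rng.normal(size=50)
x = np.zeros(d)
step_size = 0.05

for k in range(2000):
    i = rng.integers(len(A))                      # sample one data point (stochastic gradient)
    j = rng.integers(n_blocks)                    # sample one coordinate block
    g = A[i] * (A[i] @ x - b[i])                  # stochastic gradient of 0.5*(A[i] @ x - b[i])**2
    idx = blocks[j]
    # With a Euclidean distance-generating function on each block, the block
    # mirror/prox step is just a gradient step on the sampled block only.
    x[idx] -= step_size * g[idx]

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```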

Active Model Aggregation via Stochastic Mirror Descent

We consider the problem of learning a convex aggregation of models that is as good as the best convex aggregation, for the binary classification problem. Working in the stream-based active learning setting, where the active learner has to decide on the fly whether it wants to query for the label of the point currently seen in the stream, we propose a stochastic mirror descent algorithm, cal...

Mirror descent in non-convex stochastic programming

In this paper, we examine a class of nonconvex stochastic optimization problems which we call variationally coherent, and which properly includes all quasi-convex programs. In view of solving such problems, we focus on the widely used stochastic mirror descent (SMD) family of algorithms, and we establish that the method’s last iterate converges with probability 1. We further introduce a localiz...

Semi-Stochastic Gradient Descent Methods

In this paper we study the problem of minimizing the average of a large number (n) of smooth convex loss functions. We propose a new method, S2GD (Semi-Stochastic Gradient Descent), which runs for one or several epochs, in each of which a single full gradient and a random number of stochastic gradients are computed, following a geometric law. The total work needed for the method to output an ε-ac...
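
The abstract above is truncated, so the sketch below is only our reading of the described epoch structure, not code from the paper: each epoch computes one full gradient at a snapshot point, draws the number of inner steps from a (here truncated) geometric law, and then performs that many variance-reduced stochastic gradient steps. The toy objective, the parameter p, and the truncation at 5*n inner steps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 100, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def full_grad(x):
    return A.T @ (A @ x - b) / n           # gradient of 0.5 * mean((A @ x - b)**2)

x = np.zeros(d)
step_size = 0.1
p = 0.02                                    # parameter of the geometric law for the inner-loop length

for epoch in range(30):
    mu = full_grad(x)                       # one full gradient per epoch, at the snapshot x
    t = min(rng.geometric(p), 5 * n)        # random (truncated) number of inner steps
    y = x.copy()
    for _ in range(t):
        i = rng.integers(n)
        # variance-reduced stochastic gradient: unbiased estimate of full_grad(y)
        g = A[i] * (A[i] @ y - b[i]) - A[i] * (A[i] @ x - b[i]) + mu
        y -= step_size * g
    x = y                                   # new snapshot

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```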

Journal

Journal: Computational Optimization and Applications

Year: 2021

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-021-00284-5